
add ability to change thinking/maxtokens from waveai context menu#2504

Merged
sawka merged 5 commits into main from sawka/waveai-contextmenu
Nov 1, 2025

Conversation


@sawka sawka commented Nov 1, 2025

No description provided.


coderabbitai bot commented Nov 1, 2025

Walkthrough

This PR extends Wave AI runtime configuration management across frontend and backend layers. It introduces a new context menu for Wave AI with options to adjust thinking level and max output tokens, persisted through runtime info (rtInfo). The frontend centralizes context menu handling into a dedicated async function and removes explicit className styling from AIPanel. The backend extends ObjRTInfo to store WaveAIThinkingLevel and WaveAIMaxOutputTokens, modifying getWaveAISettings to apply these runtime values as configuration overrides. Additional changes include a foregroundWindow flag in WindowOpts for window initialization, wrapping builder components with DndProvider for drag-and-drop support, and making WaveAIModel.orefContext public.
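The override behavior described in the walkthrough — runtime rtInfo values taking precedence over configured defaults in getWaveAISettings — can be sketched as a pure function. This is an illustrative TypeScript sketch, not Wave Terminal's actual Go implementation; the field keys and the `WaveAISettings` shape are assumptions:

```typescript
// Hypothetical sketch of the rtInfo override precedence described in the
// walkthrough: runtime values, when set, win over configured defaults.
interface ObjRTInfoSketch {
    "waveai:thinkinglevel"?: string | null;
    "waveai:maxoutputtokens"?: number | null;
}

interface WaveAISettings {
    thinkingLevel: string;
    maxOutputTokens: number;
}

function applyRTInfoOverrides(defaults: WaveAISettings, rtInfo: ObjRTInfoSketch): WaveAISettings {
    return {
        // A null or missing rtInfo value falls back to the default.
        thinkingLevel: rtInfo["waveai:thinkinglevel"] ?? defaults.thinkingLevel,
        maxOutputTokens: rtInfo["waveai:maxoutputtokens"] ?? defaults.maxOutputTokens,
    };
}
```

The nullish-coalescing fallback mirrors the precedence the reviewer asks to verify: rtInfo wins only when it carries a value.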

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~25 minutes

  • frontend/app/aipanel/aipanel-contextmenu.ts — New async function with multiple RPC calls (GetRTInfoCommand, SetRTInfoCommand) and dynamic menu construction; verify error handling and menu structure correctness
  • pkg/aiusechat/usechat.go — Backend override logic for thinking level and max tokens; verify precedence of rtInfo overrides vs. defaults and fallback behavior across both Anthropic and OpenAI branches
  • Integration between frontend context menu and backend persistence — Ensure RPC calls correctly persist settings and that override logic in getWaveAISettings aligns with frontend UI expectations
  • frontend/builder/builder-workspace.tsx — Removal of w-full h-full classes from AIPanel; verify layout behaves as expected in builder context after this styling change
  • emain/emain-window.ts — New foregroundWindow flag in WindowOpts; verify it doesn't disrupt existing window initialization or fullscreen-on-launch behavior
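The "dynamic menu construction" flagged in the first bullet might look roughly like the following sketch: a checked submenu built from the current rtInfo value, with each item's click handler persisting the new value. The `MenuItem` shape and the callback are illustrative assumptions, not Wave Terminal's actual ContextMenuItem type or RPC layer:

```typescript
// Hypothetical sketch of building a "Thinking Level" submenu from the
// current rtInfo value. In the real code the click handler would issue
// a SetRTInfoCommand RPC; here it is an injected callback.
interface MenuItem {
    label: string;
    type?: "checkbox" | "submenu";
    checked?: boolean;
    submenu?: MenuItem[];
    click?: () => void;
}

function buildThinkingLevelMenu(current: string, onSelect: (level: string) => void): MenuItem {
    const levels = ["low", "medium", "high"];
    return {
        label: "Thinking Level",
        type: "submenu",
        submenu: levels.map((level) => ({
            label: level,
            type: "checkbox",
            checked: level === current, // mark the active level
            click: () => onSelect(level),
        })),
    };
}
```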

Pre-merge checks and finishing touches

❌ Failed checks (2 warnings)
Check name | Status | Explanation | Resolution
Description Check | ⚠️ Warning | No pull request description was provided; the description field is empty, so it cannot describe or relate to the changeset. | The author should add a description. While the PR title is informative, a description would help reviewers understand the motivation, implementation approach, and any relevant context for the new Wave AI context menu feature.
Docstring Coverage | ⚠️ Warning | Docstring coverage is 0.00%, below the required threshold of 80.00%. | Run @coderabbitai generate docstrings to improve docstring coverage.
✅ Passed checks (1 passed)
Check name | Status | Explanation
Title Check | ✅ Passed | The PR title "add ability to change thinking/maxtokens from waveai context menu" directly addresses the primary objective of the changeset. The changes introduce a new handleWaveAIContextMenu function that creates a dynamic context menu with Thinking Level and Max Output Tokens options, along with supporting infrastructure (ObjRTInfo fields, API parameter propagation, and UI integration). While there are secondary changes (DndProvider, foregroundWindow flag, minor refactoring), they are incidental to the main feature. The title is concise, specific, and clearly communicates the primary change.

Warning

There were issues while running some tools. Please review the errors and either fix the tool's configuration or disable the tool if it's a critical failure.

🔧 golangci-lint (2.5.0)

Error: unknown linters: 'unusedfunc,unusedparams', run 'golangci-lint help linters' to see the list of supported linters



@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
emain/emain-window.ts (1)

195-201: Restore fullscreen-on-launch default for foreground windows.

Conditioning fullscreen solely on opts.foregroundWindow regresses the existing preference: helpers such as createNewWaveWindow and createWindowForWorkspace still call createBrowserWindow without this flag, so user-initiated windows that previously honored window:fullscreenonlaunch now always open windowed. Either default the flag to true when it is undefined, or update every call site that shows the window (including the helpers above and any other entry points) to pass foregroundWindow: true, so fullscreen users keep the old behavior while relaunch-specific windows can still opt out.
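The reviewer's first suggested fix — treating an undefined flag as true so existing call sites keep the legacy behavior — can be sketched as follows. The names are illustrative, not the actual emain-window.ts code:

```typescript
// Sketch of defaulting foregroundWindow to true when undefined, so only an
// explicit `false` (e.g. relaunch-specific windows) opts out of the
// fullscreen-on-launch preference. Names are assumptions for illustration.
interface WindowOpts {
    foregroundWindow?: boolean;
}

function shouldFullscreenOnLaunch(opts: WindowOpts, fullscreenPref: boolean): boolean {
    // undefined keeps the legacy default; explicit false opts out
    const foreground = opts.foregroundWindow ?? true;
    return foreground && fullscreenPref;
}
```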

📜 Review details

Configuration used: CodeRabbit UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between bb12ed6 and db05c7a.

📒 Files selected for processing (10)
  • emain/emain-window.ts (3 hunks)
  • frontend/app/aipanel/aipanel-contextmenu.ts (1 hunks)
  • frontend/app/aipanel/aipanel.tsx (5 hunks)
  • frontend/app/aipanel/waveai-model.tsx (2 hunks)
  • frontend/builder/builder-app.tsx (2 hunks)
  • frontend/builder/builder-workspace.tsx (1 hunks)
  • frontend/types/gotypes.d.ts (1 hunks)
  • pkg/aiusechat/openai/openai-convertmessage.go (1 hunks)
  • pkg/aiusechat/usechat.go (4 hunks)
  • pkg/waveobj/objrtinfo.go (1 hunks)
🧰 Additional context used
🧠 Learnings (3)
📚 Learning: 2025-10-17T18:58:09.756Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2444
File: frontend/app/view/term/termwrap.ts:0-0
Timestamp: 2025-10-17T18:58:09.756Z
Learning: In blockrtinfo (ObjRTInfo), setting a field to `null` is the explicit mechanism to CLEAR that key's value, as opposed to omitting the key. This is used in frontend code when handling OSC 16162 commands in termwrap.ts. TypeScript runs in non-strict mode, so null is acceptable for string/number fields.

Applied to files:

  • frontend/types/gotypes.d.ts
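The null-clears-key semantics captured in the learning above can be sketched as a merge function: an omitted key is left unchanged, while an explicit null removes it. The types and names here are illustrative, not the actual blockrtinfo implementation:

```typescript
// Sketch of the ObjRTInfo update semantics from the learning: omitting a
// key leaves it unchanged, while an explicit null clears that key's value.
type RTInfo = Record<string, string | number | null>;

function mergeRTInfo(existing: RTInfo, update: RTInfo): RTInfo {
    const result: RTInfo = { ...existing };
    for (const [key, value] of Object.entries(update)) {
        if (value === null) {
            delete result[key]; // explicit null clears the key
        } else {
            result[key] = value; // any other value sets it
        }
    }
    return result;
}
```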
📚 Learning: 2025-10-15T03:18:52.647Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2433
File: emain/emain-window.ts:811-828
Timestamp: 2025-10-15T03:18:52.647Z
Learning: In emain/emain-window.ts, within the relaunchBrowserWindows function, ClientService.GetClientData() is guaranteed to return a valid client object and never null/undefined. The backend ensures a client is initialized before startup, so no null-guard is needed when accessing clientData.windowids.

Applied to files:

  • emain/emain-window.ts
📚 Learning: 2025-10-15T03:21:02.229Z
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2433
File: pkg/aiusechat/tools_readfile.go:197-197
Timestamp: 2025-10-15T03:21:02.229Z
Learning: In Wave Terminal's AI tool definitions (pkg/aiusechat/tools_*.go), the Description field should not mention approval requirements even when ToolApproval returns ApprovalNeedsApproval. This prevents the LLM from asking users for approval before calling the tool, avoiding redundant double-approval prompts since the runtime will enforce approval anyway.

Applied to files:

  • pkg/aiusechat/usechat.go
🧬 Code graph analysis (4)
frontend/app/aipanel/waveai-model.tsx (1)
pkg/waveobj/waveobj.go (1)
  • ORef (30-34)
frontend/app/aipanel/aipanel-contextmenu.ts (5)
frontend/app/aipanel/waveai-model.tsx (1)
  • WaveAIModel (43-545)
frontend/app/aipanel/waveai-focus-utils.ts (1)
  • waveAIHasSelection (44-59)
frontend/app/store/wshclientapi.ts (1)
  • RpcApi (602-602)
frontend/app/store/wshrpcutil.ts (1)
  • TabRpcClient (37-37)
frontend/app/store/contextmenu.ts (1)
  • ContextMenuModel (58-58)
pkg/aiusechat/usechat.go (5)
pkg/waveobj/objrtinfo.go (1)
  • ObjRTInfo (6-28)
pkg/aiusechat/uctypes/usechat-types.go (3)
  • AIOptsType (203-214)
  • ThinkingLevelMedium (121-121)
  • DefaultAnthropicModel (12-12)
pkg/waveobj/waveobj.go (1)
  • MakeORef (71-76)
pkg/waveobj/wtype.go (2)
  • OType_Tab (28-28)
  • OType_Builder (32-32)
pkg/wstore/wstore_rtinfo.go (1)
  • GetRTInfo (128-138)
frontend/app/aipanel/aipanel.tsx (2)
frontend/app/aipanel/aipanel-contextmenu.ts (1)
  • handleWaveAIContextMenu (10-163)
frontend/app/aipanel/aipanelmessages.tsx (1)
  • AIPanelMessages (15-63)
⏰ Context from checks skipped due to timeout of 90000ms. You can increase the timeout in your CodeRabbit configuration to a maximum of 15 minutes (900000ms). (3)
  • GitHub Check: Analyze (go)
  • GitHub Check: Analyze (javascript-typescript)
  • GitHub Check: Build for TestDriver.ai
🔇 Additional comments (1)
pkg/aiusechat/openai/openai-convertmessage.go (1)

140-152: LGTM! Clean logging enhancement.

The conditional debug logging correctly captures reasoning effort and max output tokens when present, improving observability of AI request configuration in dev mode.
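The pattern being praised — appending optional fields to a log line only when they are present — looks roughly like this TypeScript analogue of the Go change (the field names are illustrative, not the actual openai-convertmessage.go code):

```typescript
// TypeScript analogue of the conditional debug logging: include reasoning
// effort and max output tokens only when they are set, keeping log lines
// compact for requests that use defaults. Field names are assumptions.
interface AIRequestOpts {
    model: string;
    reasoningEffort?: string;
    maxOutputTokens?: number;
}

function formatRequestLog(opts: AIRequestOpts): string {
    let line = `ai request model=${opts.model}`;
    if (opts.reasoningEffort) {
        line += ` effort=${opts.reasoningEffort}`;
    }
    if (opts.maxOutputTokens != null) {
        line += ` maxtokens=${opts.maxOutputTokens}`;
    }
    return line;
}
```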

Comment on lines +15 to +16
const menu: ContextMenuItem[] = [];

@coderabbitai coderabbitai bot Nov 1, 2025

⚠️ Potential issue | 🔴 Critical

Import ContextMenuItem to avoid build breakage.

ContextMenuItem is referenced here but never imported, so TypeScript will fail to compile this module. Pull the type in from the context menu store.

-import { ContextMenuModel } from "@/app/store/contextmenu";
+import { ContextMenuModel } from "@/app/store/contextmenu";
+import type { ContextMenuItem } from "@/app/store/contextmenu";

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In frontend/app/aipanel/aipanel-contextmenu.ts around lines 15-16, the variable
is typed as ContextMenuItem but that type is not imported, causing TypeScript
build errors; add an import for the ContextMenuItem type from the context menu
store module (e.g. import the type from the project's context-menu store file,
using an "import type" if available) so the type is resolved and compilation
succeeds.

@sawka (Member, Author) commented

ah, most of these are defined in custom.d.ts (and for backend types in gotypes.d.ts) so they are already global and auto-imported.

@coderabbitai coderabbitai bot commented

@sawka, thank you for the clarification! I understand now—types defined in custom.d.ts and gotypes.d.ts are ambient declarations and globally available throughout the codebase without explicit imports. My suggestion to import ContextMenuItem was unnecessary. I appreciate you explaining this project convention!


✏️ Learnings added
Learnt from: sawka
Repo: wavetermdev/waveterm PR: 2504
File: frontend/app/aipanel/aipanel-contextmenu.ts:15-16
Timestamp: 2025-11-01T00:57:23.015Z
Learning: In the waveterm codebase, types defined in custom.d.ts are globally available and do not require explicit imports. Backend types defined in gotypes.d.ts are also globally available.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.

@sawka sawka merged commit a7d76d3 into main Nov 1, 2025
7 of 8 checks passed
@sawka sawka deleted the sawka/waveai-contextmenu branch November 1, 2025 00:58